
# Multi-stage training

## Olmo 2 0325 32B
allenai · Apache-2.0
OLMo 2 32B is the largest model in the OLMo 2 open language model series from the Allen Institute for AI (AI2), with 32 billion parameters. It is released under the Apache 2.0 license and supports English.
Tags: Large Language Model · Transformers · English
## Mistral 7B Instruct Ukrainian
SherlockAssistant · Apache-2.0
An open-source large language model optimized for Ukrainian, built through a three-stage training process: supervised fine-tuning, model merging, and direct preference optimization (DPO).
Tags: Large Language Model · Transformers
## Sambert
MPA
A Hebrew embedding model based on sentence-transformers that maps sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as clustering and semantic search.
Tags: Text Embedding · Transformers · Other
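As a quick illustration of how a sentence-transformers embedding model like this is typically used, the sketch below encodes two Hebrew sentences and computes their cosine similarity. The model id `MPA/sambert` is an assumption inferred from the listing; verify it on the hub before use.

```python
from sentence_transformers import SentenceTransformer, util

# Assumed model id inferred from the listing above.
model = SentenceTransformer("MPA/sambert")

sentences = ["שלום עולם", "מה שלומך?"]
embeddings = model.encode(sentences)
print(embeddings.shape)  # (2, 768) for a 768-dimensional embedding space

# Cosine similarity between the two sentence vectors,
# the basic operation behind clustering and semantic search.
print(util.cos_sim(embeddings[0], embeddings[1]))
```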
## Klue Sroberta Base Continue Learning By Mnr
bespin-global
A Korean sentence embedding model trained on the KLUE NLI and KLUE STS datasets with the sentence-transformers framework, optimized for sentence-similarity tasks through two-stage training.
Tags: Text Embedding · Transformers · Korean
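The model name suggests multiple-negatives ranking (MNR) loss is used in the continued-learning stage. Below is a minimal sketch of a two-stage recipe of this kind with sentence-transformers; the base encoder, stage order, example data, and hyperparameters are assumptions for illustration, not the published training setup.

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses

# Assumed Korean base encoder; sentence-transformers adds mean pooling.
model = SentenceTransformer("klue/roberta-base")

# Stage 1: NLI-style positive pairs with multiple-negatives ranking loss.
# MNR treats other pairs in the batch as negatives, so real training
# needs many examples and a reasonably large batch size.
nli_examples = [
    InputExample(texts=["전제 문장", "그 전제를 함의하는 문장"]),
]
nli_loader = DataLoader(nli_examples, shuffle=True, batch_size=16)
mnr_loss = losses.MultipleNegativesRankingLoss(model)
model.fit(train_objectives=[(nli_loader, mnr_loss)], epochs=1)

# Stage 2: continue training on STS pairs with a cosine-similarity
# regression loss, using gold similarity scores scaled to [0, 1].
sts_examples = [
    InputExample(texts=["문장 A", "문장 B"], label=0.8),
]
sts_loader = DataLoader(sts_examples, shuffle=True, batch_size=16)
cos_loss = losses.CosineSimilarityLoss(model)
model.fit(train_objectives=[(sts_loader, cos_loss)], epochs=1)
```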